35 research outputs found
A finite simulation method in a non-deterministic call-by-need calculus with letrec, constructors and case
The paper proposes a variation of simulation for checking and proving contextual equivalence in a non-deterministic call-by-need lambda-calculus with constructors, case, seq, and a letrec with cyclic dependencies. It also proposes a novel method to prove its correctness. The calculus's semantics is based on a small-step rewrite semantics and on may-convergence. The cyclic nature of letrec bindings, as well as non-determinism, makes known approaches to proving that simulation implies contextual equivalence, such as Howe's proof technique, inapplicable in this setting. The basic technique for the simulation as well as the correctness proof is called pre-evaluation, which computes a set of answers for every closed expression. If simulation succeeds in finite computation depth, then it is guaranteed to show the contextual preorder of the expressions.
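The cyclic letrec bindings at the heart of this calculus can be pictured in Haskell, whose recursive let has call-by-need semantics. The following sketch is ours, not the paper's; it merely shows a self-referential and a cyclically dependent binding of the kind the calculus models:

```haskell
-- Cyclic bindings in Haskell's recursive let, the kind of letrec
-- dependency the calculus models (illustrative sketch only).
ones :: [Int]
ones = 1 : ones  -- 'ones' is defined in terms of itself

fibs :: [Integer]
fibs = 0 : 1 : zipWith (+) fibs (tail fibs)  -- cyclic dependency via sharing

main :: IO ()
main = print (take 5 ones, take 8 fibs)
```

Under call-by-need such bindings are evaluated at most once and shared, and it is this sharing across cyclic structure that makes standard techniques like Howe's method hard to apply.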
Simulation in the call-by-need lambda-calculus with letrec
This paper shows the equivalence of applicative similarity and contextual approximation, and hence also of bisimilarity and contextual equivalence, in the deterministic call-by-need lambda calculus with letrec. Bisimilarity simplifies equivalence proofs in the calculus and opens a way for more convenient correctness proofs for program transformations. Although this property may be a natural one to expect, to the best of our knowledge, this paper is the first one providing a proof. The proof technique is to transfer contextual approximation into Abramsky's lazy lambda calculus by a fully abstract and surjective translation. This also shows that the natural embedding of Abramsky's lazy lambda calculus into the call-by-need lambda calculus with letrec is an isomorphism between the respective term models. We show that the equivalence property proven in this paper transfers to a call-by-need letrec calculus developed by Ariola and Felleisen.
Counterexamples to simulation in non-deterministic call-by-need lambda-calculi with letrec
This note shows that in non-deterministic extended lambda calculi with letrec, the tool of applicative (bi)simulation is in general not usable for contextual equivalence, by giving a counterexample adapted from data flow analysis. It is also shown that there is a flaw in a lemma and a theorem concerning finite simulation in a conference paper by the first two authors.
Simulation in the Call-by-Need Lambda-Calculus with Letrec, Case, Constructors, and Seq
This paper shows equivalence of several versions of applicative similarity
and contextual approximation, and hence also of applicative bisimilarity and
contextual equivalence, in LR, the deterministic call-by-need lambda calculus
with letrec extended by data constructors, case-expressions and Haskell's
seq-operator. LR models an untyped version of the core language of Haskell. The
use of bisimilarities simplifies equivalence proofs in calculi and opens a way
for more convenient correctness proofs for program transformations. The proof
is by a fully abstract and surjective transfer into a call-by-name calculus,
which is an extension of Abramsky's lazy lambda calculus. In the latter
calculus equivalence of our similarities and contextual approximation can be
shown by Howe's method. Similarity is transferred back to LR on the basis of an
inductively defined similarity. The translation from the call-by-need letrec
calculus into the extended call-by-name lambda calculus is the composition of
two translations. The first translation replaces the call-by-need strategy by a
call-by-name strategy and its correctness is shown by exploiting infinite trees
which emerge by unfolding the letrec expressions. The second translation
encodes letrec-expressions by using multi-fixpoint combinators and its
correctness is shown syntactically by comparing reductions of both calculi. A
further result of this paper is an isomorphism between the mentioned calculi,
which is also an identity on letrec-free expressions.
Comment: 50 pages, 11 figures
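The second translation's idea, replacing letrec by fixpoint combinators, can be sketched in Haskell. This is our simplification to a single binding; the paper's translation uses multi-fixpoint combinators for mutually recursive binding groups:

```haskell
-- Encoding a single letrec binding with an explicit fixpoint combinator,
-- a simplified sketch of the multi-fixpoint translation (names are ours).
fix' :: (a -> a) -> a
fix' f = let x = f x in x  -- sharing-preserving fixpoint

-- "letrec fact = \n -> ... in fact", encoded without letrec:
factEnc :: Integer -> Integer
factEnc = fix' (\fact n -> if n == 0 then 1 else n * fact (n - 1))

main :: IO ()
main = print (factEnc 5)  -- 120
```

The correctness question the paper answers is whether such an encoding preserves and reflects contextual equivalence, which it establishes by comparing reductions of the two calculi.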
Critical Media, Information, and Digital Literacy: Increasing Understanding of Machine Learning Through an Interdisciplinary Undergraduate Course
Widespread use of Artificial Intelligence in all areas of today's society creates a unique problem: algorithms used in decision-making are generally not understandable to those without a background in data science. Thus, those who use out-of-the-box Machine Learning (ML) approaches in their work and those affected by these approaches are often not in a position to analyse their outcomes and applicability. Our paper describes and evaluates our undergraduate course at the University of Minnesota Morris, which fosters understanding of the main ideas behind ML. With Communication, Media & Rhetoric and Computer Science faculty expertise, students from a variety of majors, most with no prior background in data science or computing, reviewed the scope of applicability of algorithms and became aware of possible biases, "politics" and pitfalls. After discussing articles on societal attitudes towards technology, explaining key concepts behind ML algorithms (training and dependence on data), and constructing a decision tree as an example of an algorithm, we attempted to develop guidelines for "best practices" for use of algorithms. Students presented a "case analysis" capstone paper on an application of machine learning in society. Paper topics included: use of algorithms by child protection services, "deepfake" videos, and genetic testing. The level of papers was indicative of students' strong interest in the subject and their ability to understand key terms and ideas behind algorithms, societal perception and misconceptions of use of algorithms, and their ability to identify good and problematic practices in use of algorithms.
Extending Abramsky's Lazy Lambda Calculus: (Non)-Conservativity of Embeddings
Our motivation is the question whether the lazy lambda calculus, a pure lambda calculus with the leftmost-outermost rewriting strategy considered under observational semantics, or extensions thereof, is an adequate model for semantic equivalences in real-world purely functional programming languages, in particular for a pure core language of Haskell. We explore several extensions of the lazy lambda calculus: addition of a seq-operator, addition of data constructors and case-expressions, and their combination, focusing on conservativity of these extensions. In addition to untyped calculi, we study their monomorphically and polymorphically typed versions. For most of the extensions we obtain non-conservativity, which we prove by providing counterexamples. However, we prove conservativity of the extension by data constructors and case in the monomorphically typed scenario.
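A standard source of such non-conservativity counterexamples is that seq can separate a divergent term from a lambda abstraction whose body diverges, even though no context of the pure lazy lambda calculus distinguishes them. The rendering below is ours, and it uses error as a stand-in for an actual divergent loop so the demonstration terminates observably:

```haskell
import Control.Exception (SomeException, evaluate, try)

-- 'bot' models a divergent term; we use 'error' instead of a real loop
-- so that the "divergent" branch terminates with an observable exception.
bot :: a
bot = error "bot"

main :: IO ()
main = do
  -- \x -> bot is already a value (WHNF), so seq lets evaluation proceed:
  a <- try (evaluate ((\_ -> bot :: ()) `seq` True))
         :: IO (Either SomeException Bool)
  -- bot itself has no WHNF, so seq propagates the divergence:
  b <- try (evaluate (bot `seq` True))
         :: IO (Either SomeException Bool)
  putStrLn (either (const "diverges") show a)  -- prints "True"
  putStrLn (either (const "diverges") show b)  -- prints "diverges"
```

Because only a context containing seq can observe this difference, adding seq changes which terms are contextually equivalent, which is the shape of non-conservativity this paper studies.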